Variable-order Markov model
In stochastic processes, variable-order Markov (VOM) models are an important class of models that extend the well-known Markov chain models. In contrast to Markov chain models, where each random variable in a sequence with the Markov property depends on a fixed number of preceding random variables, in VOM models this number of conditioning random variables may vary with the specific observed realization.
This realization sequence is often called the ''context''; VOM models are therefore also called ''context trees''. The flexibility in the number of conditioning random variables turns out to be a real advantage in many applications, such as statistical analysis, classification and prediction.
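As a rough illustration of this prediction scheme, the following minimal Python sketch (not an algorithm quoted from this article; the function name, model layout and toy values are invented for the example, and happen to match the worked example in the next section) stores next-symbol distributions for contexts of varying length and falls back from the longest matching context to shorter ones:
<syntaxhighlight lang="python">
# Minimal sketch of variable-order prediction (illustrative only).
# The model maps contexts of *varying* length to next-symbol distributions;
# prediction uses the longest stored context that matches the recent history.

def predict_next(model, history, max_order):
    """Return the next-symbol distribution for the longest matching context."""
    for k in range(min(max_order, len(history)), -1, -1):
        context = history[len(history) - k:]
        if context in model:
            return model[context]
    return None  # no matching context, not even the empty one

# Toy model with contexts of mixed length (hand-filled for illustration).
toy_model = {
    "aa": {"a": 0.5, "b": 0.5},
    "b":  {"c": 1.0},
    "c":  {"a": 1.0},
    "ca": {"a": 1.0},
}
print(predict_next(toy_model, "aaab", max_order=2))  # matches context "b"  -> {'c': 1.0}
print(predict_next(toy_model, "caaa", max_order=2))  # matches context "aa" -> {'a': 0.5, 'b': 0.5}
</syntaxhighlight>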
==Example==

Consider for example a sequence of random variables, each of which takes a value from the ternary alphabet {''a'', ''b'', ''c''}. Specifically, consider the string ''aaabcaaabcaaabcaaabc...aaabc'' constructed from infinite concatenations of the sub-string ''aaabc''.
The VOM model of maximal order 2 can approximate the above string using ''only'' the following five conditional probability components: Pr(''a''|''aa'') = 0.5, Pr(''b''|''aa'') = 0.5, Pr(''c''|''b'') = 1.0, Pr(''a''|''c'') = 1.0 and Pr(''a''|''ca'') = 1.0.
In this example, Pr(''c''|''ab'') = Pr(''c''|''b'') = 1.0; therefore, the shorter context ''b'' is sufficient to determine the next character. Similarly, the VOM model of maximal order 3 can generate the string exactly using only five conditional probability components, which are all equal to 1.0.
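The five components above can be checked empirically with a short script (a sketch; the counting scheme below is our own illustration, not a construction given in this article), by estimating conditional probabilities from many repetitions of ''aaabc'':
<syntaxhighlight lang="python">
# Estimate Pr(next | context) from repetitions of "aaabc" and confirm that
# the five order-2 VOM components above suffice (illustrative sketch only).
from collections import Counter, defaultdict

s = "aaabc" * 1000  # finite stand-in for the infinite concatenation

def conditional_probs(seq, order):
    """Empirical Pr(next symbol | context) for all contexts of exactly `order` symbols."""
    counts = defaultdict(Counter)
    for i in range(order, len(seq)):
        counts[seq[i - order:i]][seq[i]] += 1
    return {ctx: {sym: n / sum(c.values()) for sym, n in c.items()}
            for ctx, c in counts.items()}

p1, p2 = conditional_probs(s, 1), conditional_probs(s, 2)
print(p2["aa"])           # {'a': 0.5, 'b': 0.5} -- genuinely needs the length-2 context
print(p1["b"], p2["ab"])  # both {'c': 1.0}      -- the shorter context 'b' suffices
print(p1["c"], p2["bc"])  # both {'a': 1.0}      -- the shorter context 'c' suffices
print(p2["ca"])           # {'a': 1.0}           -- kept at length 2, since p1['a'] is not deterministic
</syntaxhighlight>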

To construct a Markov chain of order 1 for the next character in that string, one must estimate the 9 conditional probability components Pr(''x''|''y'') for all symbols ''x'', ''y'' in {''a'', ''b'', ''c''}. To construct a Markov chain of order 2, one must estimate the 27 components Pr(''x''|''yz'') for every symbol ''x'' and every length-2 context ''yz''; and to construct a Markov chain of order 3, one must estimate the 81 components Pr(''x''|''yzw'') for every symbol ''x'' and every length-3 context ''yzw''.
In practical settings there is seldom sufficient data to accurately estimate the exponentially increasing number of conditional probability components as the order of the Markov chain increases.
The variable-order Markov model assumes that in realistic settings, there are certain realizations of states (represented by contexts) in which some past states are independent of the future states; accordingly, "a great reduction in the number of model parameters can be achieved."
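For comparison, the parameter counts quoted above follow from simple arithmetic: a full Markov chain of order ''k'' over an alphabet of size ''m'' has ''m''<sup>''k''</sup> contexts, each carrying a distribution over ''m'' symbols. A short sketch of that arithmetic (our own illustration, not a formula quoted from this article):
<syntaxhighlight lang="python">
# Parameter growth of a full (fixed-order) Markov chain over a ternary alphabet.
m = 3  # alphabet size, {a, b, c}
for k in (1, 2, 3):
    components = m ** k * m         # conditional probability components Pr(x | context)
    free_params = m ** k * (m - 1)  # independent parameters (each distribution sums to 1)
    print(f"order {k}: {components} components, {free_params} free parameters")
# order 1: 9 components, order 2: 27, order 3: 81 -- versus only 5 components
# for the order-2 VOM model in the example above.
</syntaxhighlight>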

Excerpt source: Wikipedia, the free encyclopedia.
Read the full article "Variable-order Markov model" at Wikipedia.


